Can high-order dependencies improve mutual information based feature selection?
Abstract
Mutual information (MI) based approaches are a popular paradigm for feature selection. Most previous methods have made use of low-dimensional MI quantities that are only effective at detecting low-order dependencies between variables. Several works have considered the use of higher dimensional mutual information, but the theoretical underpinning of these approaches is not yet comprehensive. To ...
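The gap between low- and high-order dependencies can be made concrete with a small sketch (illustrative code, not from the paper): a first-order MI filter scores each feature independently against the label, and it misses an XOR-style interaction that a second-order (pairwise) MI quantity detects.

```python
import math
from collections import Counter

def mi(xs, ys):
    """Empirical mutual information (nats) between two discrete sequences."""
    n = len(xs)
    px, py, pxy = Counter(xs), Counter(ys), Counter(zip(xs, ys))
    return sum((c / n) * math.log(c * n / (px[x] * py[y]))
               for (x, y), c in pxy.items())

# XOR labels: each feature alone is independent of y,
# so a first-order MI filter scores both features as useless...
X = [(0, 0), (0, 1), (1, 0), (1, 1)]
y = [0, 1, 1, 0]
f1 = [a for a, _ in X]
f2 = [b for _, b in X]
print(mi(f1, y), mi(f2, y))      # both 0.0
# ...while the joint (second-order) MI recovers the dependency:
print(mi(list(zip(f1, f2)), y))  # log(2) ≈ 0.693
```

The names (`mi`, `f1`, `f2`) are illustrative; the paper's own estimators and criteria are not reproduced here.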
Similar resources

Conditional Dynamic Mutual Information-Based Feature Selection
With the emergence of new techniques, data in many fields are getting larger and larger, especially in their dimensionality. The high dimensionality of data may pose great challenges to traditional learning algorithms. In fact, many of the features in large volumes of data are redundant and noisy. Their presence not only degrades the performance of learning algorithms, but also confuses end-users in th...
Feature Selection Based on Joint Mutual Information
A feature/input selection method is proposed based on joint mutual information. The new method is better than the existing methods based on mutual information in eliminating redundancy in the inputs. It is applied in a real world application to find 2-D viewing coordinates for data visualization and to select inputs for a neural network classifier. The result shows that the new method can find many ...
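As a rough illustration of the joint-MI idea (a hypothetical sketch, not this paper's implementation), a candidate feature can be scored by its joint MI with each already-selected feature and the class; complementary features then outscore redundant copies:

```python
import math
from collections import Counter

def mi(xs, ys):
    """Empirical mutual information (nats) between two discrete sequences."""
    n = len(xs)
    px, py, pxy = Counter(xs), Counter(ys), Counter(zip(xs, ys))
    return sum((c / n) * math.log(c * n / (px[x] * py[y]))
               for (x, y), c in pxy.items())

def jmi_score(X, y, cand, selected):
    """JMI-style criterion: sum of I((X_cand, X_s); y) over selected s."""
    col = lambda j: [row[j] for row in X]
    return sum(mi(list(zip(col(cand), col(s))), y) for s in selected)

# Features 0 and 1 form an XOR with y; feature 2 duplicates feature 0.
X = [[0, 0, 0], [0, 1, 0], [1, 0, 1], [1, 1, 1]]
y = [0, 1, 1, 0]

# With feature 0 already selected, the joint score prefers the
# complementary feature 1 over the redundant copy, feature 2.
print(jmi_score(X, y, 1, [0]))  # ≈ 0.693 (log 2)
print(jmi_score(X, y, 2, [0]))  # 0.0
```

All function and variable names here are assumptions made for the sketch.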
Efficient High-Order Interaction-Aware Feature Selection Based on Conditional Mutual Information
This study introduces a novel feature selection approach CMICOT, which is a further evolution of filter methods with sequential forward selection (SFS) whose scoring functions are based on conditional mutual information (MI). We state and study a novel saddle point (max-min) optimization problem to build a scoring function that is able to identify joint interactions between several features. Th...
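CMICOT's actual max-min scoring is more elaborate than this, but the underlying sequential-forward-selection loop with a conditional-MI score can be sketched as follows (a minimal illustration, all names hypothetical):

```python
import math
from collections import Counter

def cond_mi(xs, ys, zs):
    """Empirical conditional MI I(X; Y | Z) in nats for discrete data."""
    n = len(xs)
    pz, pxz = Counter(zs), Counter(zip(xs, zs))
    pyz, pxyz = Counter(zip(ys, zs)), Counter(zip(xs, ys, zs))
    return sum((c / n) * math.log(pz[z] * c / (pxz[(x, z)] * pyz[(y, z)]))
               for (x, y, z), c in pxyz.items())

def sfs_cmi(X, y, k):
    """Greedy forward selection: add the feature with the highest
    MI with y conditioned on the already-selected subset."""
    d, selected = len(X[0]), []
    while len(selected) < k:
        # Treat the selected subset as one joint discrete variable.
        joint = [tuple(row[j] for j in selected) for row in X]
        col = lambda j: [row[j] for row in X]
        best = max((j for j in range(d) if j not in selected),
                   key=lambda j: cond_mi(col(j), y, joint))
        selected.append(best)
    return selected

# XOR example: neither feature alone predicts y, but conditioning on
# the first selected feature exposes the second one's contribution.
X = [[0, 0], [0, 1], [1, 0], [1, 1]] * 2
y = [0, 1, 1, 0] * 2
print(sfs_cmi(X, y, 2))  # [0, 1]
```

This greedy loop is the generic SFS skeleton the abstract refers to; the paper's saddle-point (max-min) construction of the scoring function is not reproduced.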
Higher Order Mutual Information Approximation for Feature Selection
Feature selection is a process of choosing a subset of relevant features so that the quality of prediction models can be improved. An extensive body of work exists on information-theoretic feature selection, based on maximizing Mutual Information (MI) between subsets of features and class labels. The prior methods use a lower order approximation, by treating the joint entropy as a summation of ...
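The gap between a joint entropy and its first-order (marginal-sum) approximation can be checked numerically in a few lines (an illustrative sketch on toy data, not the paper's method):

```python
import math
from collections import Counter

def entropy(samples):
    """Empirical Shannon entropy (nats) of a discrete sequence."""
    n = len(samples)
    return -sum((c / n) * math.log(c / n) for c in Counter(samples).values())

# Two correlated binary features (toy data).
x1 = [0, 0, 1, 1, 0, 0, 1, 1]
x2 = [0, 0, 1, 1, 0, 0, 1, 0]

h_joint = entropy(list(zip(x1, x2)))  # exact joint entropy H(X1, X2)
h_sum = entropy(x1) + entropy(x2)     # first-order approximation

# H(X1, X2) = H(X1) + H(X2) - I(X1; X2), so the marginal sum
# over-counts whatever information the features share.
print(h_joint, h_sum)  # the joint entropy is strictly smaller here
```

Because `I(X1; X2) >= 0`, the marginal sum is always an upper bound, and it is loose exactly when features are dependent, which is what motivates higher-order corrections.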
Journal

Journal title: Pattern Recognition
Year: 2016
ISSN: 0031-3203
DOI: 10.1016/j.patcog.2015.11.007